Kernel Dimension Reduction in Regression
Authors: Kenji Fukumizu, Francis R. Bach, Michael I. Jordan
Abstract
We present a new methodology for sufficient dimension reduction (SDR). Our methodology derives directly from the formulation of SDR in terms of the conditional independence of the covariate X from the response Y, given the projection of X on the central subspace [cf. J. Amer. Statist. Assoc. 86 (1991) 316–342 and Regression Graphics (1998) Wiley]. We show that this conditional independence assertion can be characterized in terms of conditional covariance operators on reproducing kernel Hilbert spaces, and we show how this characterization leads to an M-estimator for the central subspace. The resulting estimator is shown to be consistent under weak conditions; in particular, we do not have to impose linearity or ellipticity conditions of the kinds that are generally invoked for SDR methods. We also present empirical results showing that the new methodology is competitive in practice.
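As a rough illustration of the estimator this abstract describes: the central subspace is parametrized by a matrix B with orthonormal columns, and B is chosen to minimize the trace of a regularized empirical conditional covariance, computed from centered Gram matrices. The sketch below (plain NumPy, Gaussian kernels, naive finite-difference gradients; function names such as `kdr_fit` are hypothetical) shows the shape of that objective, not the authors' implementation.

```python
import numpy as np

def gram_gaussian(Z, sigma):
    """Centered Gaussian Gram matrix H K H, with H = I - (1/n) 11^T."""
    sq = np.sum(Z**2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2 * Z @ Z.T) / (2 * sigma**2))
    n = len(Z)
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kdr_objective(B, X, Gy, sigma, eps):
    """Tr[Gy (Gz + n*eps*I)^{-1}]: a regularized empirical surrogate for the
    trace of the conditional covariance of Y given Z = X B."""
    n = len(X)
    Gz = gram_gaussian(X @ B, sigma)
    return np.trace(np.linalg.solve(Gz + n * eps * np.eye(n), Gy))

def kdr_fit(X, y, d, sigma=1.0, eps=1e-3, n_iter=100, lr=1e-2, h=1e-5):
    """Minimize the objective over matrices B with orthonormal columns,
    using crude finite-difference gradients and a QR retraction."""
    rng = np.random.default_rng(0)
    p = X.shape[1]
    B, _ = np.linalg.qr(rng.standard_normal((p, d)))
    Gy = gram_gaussian(y.reshape(-1, 1), sigma)
    for _ in range(n_iter):
        f0 = kdr_objective(B, X, Gy, sigma, eps)
        grad = np.zeros_like(B)
        for i in range(p):
            for j in range(d):
                Bp = B.copy()
                Bp[i, j] += h
                grad[i, j] = (kdr_objective(Bp, X, Gy, sigma, eps) - f0) / h
        B, _ = np.linalg.qr(B - lr * grad)  # retract to orthonormal columns
    return B  # columns span the estimated central subspace
```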
Similar works
Consistency of regularized sliced inverse regression for kernel models
We develop an extension of the sliced inverse regression (SIR) framework for dimension reduction using kernel models and Tikhonov regularization. The result is a numerically stable nonlinear dimension reduction method. We prove consistency of the method under weak conditions even when the reproducing kernel Hilbert space induced by the kernel is infinite dimensional. We illustrate the utility o...
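The paper works in a possibly infinite-dimensional RKHS; as a finite-dimensional stand-in, the sketch below shows where Tikhonov regularization enters the classical SIR eigenproblem, replacing the inverse covariance with (Sigma + lam*I)^{-1}. The name `sir_directions` and the defaults are illustrative, not from the paper.

```python
import numpy as np

def sir_directions(X, y, d, n_slices=10, lam=1e-3):
    """SIR with a Tikhonov-regularized covariance inverse: leading
    eigenvectors of (Sigma + lam*I)^{-1} Cov(E[X|Y])."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    # Slice the data on the order statistics of y; average X within slices.
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Xc[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # The ridge term lam*I stabilizes the (near-)singular covariance.
    vals, vecs = np.linalg.eig(np.linalg.solve(Sigma + lam * np.eye(p), M))
    return vecs.real[:, np.argsort(-vals.real)[:d]]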
Gradient-based kernel dimension reduction for regression
This paper proposes a novel approach to linear dimension reduction for regression using nonparametric estimation with positive definite kernels or reproducing kernel Hilbert spaces. The purpose of the dimension reduction is to find directions in the explanatory variables that explain the response sufficiently; this is called sufficient dimension reduction. The proposed method is based on a...
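One common gradient-based construction (not necessarily this paper's exact estimator): fit kernel ridge regression with a Gaussian kernel, differentiate the fitted function at each sample, and take the leading eigenvectors of the averaged outer product of those gradients. A hypothetical sketch:

```python
import numpy as np

def gkdr_directions(X, y, d, sigma=1.0, lam=1e-3):
    """Fit kernel ridge regression, differentiate the fitted function at
    each sample, and eigen-decompose the averaged gradient outer products."""
    n, p = X.shape
    sq = np.sum(X**2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma**2))
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)  # ridge coefficients
    M = np.zeros((p, p))
    for i in range(n):
        # grad f(x_i) = sum_j alpha_j k(x_i, x_j) (x_j - x_i) / sigma^2
        w = alpha * K[i]
        g = (X - X[i]).T @ w / sigma**2
        M += np.outer(g, g) / n
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, np.argsort(-vals)[:d]]  # directions in which f varies most
```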
Regression on manifolds using kernel dimension reduction
We study the problem of discovering a manifold that best preserves information relevant to a nonlinear regression. Solving this problem involves extending and uniting two threads of research. On the one hand, the literature on sufficient dimension reduction has focused on methods for finding the best linear subspace for nonlinear regression; we extend this to manifolds. On the other hand, the l...
Sliced Coordinate Analysis for Effective Dimension Reduction and Nonlinear Extensions
Sliced inverse regression (SIR) is an important method for reducing the dimensionality of input variables. Its goal is to estimate the effective dimension reduction directions. In classification settings, SIR is closely related to Fisher discriminant analysis. Motivated by reproducing kernel theory, we propose a notion of nonlinear effective dimension reduction and develop a nonlinear extension...
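One pragmatic way to realize nonlinear effective dimension reduction in the spirit of this abstract is to map the inputs to kernel principal component scores and run ordinary SIR on those scores. The self-contained sketch below does exactly that; the two-step construction and the name `kernel_sir` are illustrative stand-ins, not the paper's algorithm.

```python
import numpy as np

def kernel_sir(X, y, d, n_components=20, n_slices=10, sigma=1.0, lam=1e-3):
    """Nonlinear EDR sketch: kernel PCA scores, then ordinary SIR on them."""
    n = len(X)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma**2))
    H = np.eye(n) - np.ones((n, n)) / n
    vals, vecs = np.linalg.eigh(H @ K @ H)
    top = np.argsort(-vals)[:n_components]
    F = vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))  # KPCA scores
    # Ordinary SIR on the feature scores F.
    Fc = F - F.mean(axis=0)
    Sigma = Fc.T @ Fc / n
    M = np.zeros((n_components, n_components))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Fc[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    w, V = np.linalg.eig(np.linalg.solve(Sigma + lam * np.eye(n_components), M))
    B = V.real[:, np.argsort(-w.real)[:d]]
    return F @ B  # nonlinear reduced coordinates of the training points
```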
An introduction to dimension reduction in nonparametric kernel regression
Nonparametric regression is a powerful tool to estimate nonlinear relations between some predictors and a response variable. However, when the number of predictors is high, nonparametric estimators may suffer from the curse of dimensionality. In this chapter, we show how a dimension reduction method (namely Sliced Inverse Regression) can be combined with nonparametric kernel regression to overc...
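The combination described here is simple to state: estimate a few SIR directions first, then run the kernel smoother on the d-dimensional projections rather than the p original predictors, so the Nadaraya-Watson estimator never faces the full dimensionality. A minimal sketch (Gaussian kernel, fixed bandwidth h; names are illustrative, and the usage comment reuses the `sir_directions` sketch above):

```python
import numpy as np

def nadaraya_watson(Z_train, y_train, Z_query, h=0.5):
    """Gaussian-kernel Nadaraya-Watson estimator on reduced coordinates."""
    d2 = np.sum((Z_query[:, None, :] - Z_train[None, :, :])**2, axis=2)
    W = np.exp(-d2 / (2 * h**2))
    return (W @ y_train) / W.sum(axis=1)

# Usage sketch: reduce first, then smooth in low dimension, e.g.
#   B = sir_directions(X, y, d=2)            # from the SIR sketch above
#   y_hat = nadaraya_watson(X @ B, y, Xq @ B)
```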
Kernel Inverse Regression for Spatial Random Fields
In this paper, we propose a dimension reduction model for spatially dependent variables. Namely, we investigate an extension of the inverse regression method under a strong mixing condition. The method is based on estimating the covariance matrix of the expectation of the explanatory variable given the dependent variable, called the inverse regression. We then study, under the strong mixing condition,...
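Setting the spatial-mixing machinery aside, the core estimator can be sketched as a smoothed version of SIR: estimate the inverse regression curve E[X | Y = y] with a Nadaraya-Watson smoother in y instead of slicing, then eigen-decompose its covariance against the predictor covariance. A hypothetical i.i.d.-data sketch (the paper's treatment of spatial dependence is omitted):

```python
import numpy as np

def kernel_inverse_regression(X, y, d, h=0.5, lam=1e-3):
    """Smoothed inverse regression: replace SIR's slicing with a
    Nadaraya-Watson estimate of the inverse curve E[X | Y = y]."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    W = np.exp(-((y[:, None] - y[None, :])**2) / (2 * h**2))
    Ehat = (W @ Xc) / W.sum(axis=1, keepdims=True)  # E[X | Y = y_i], row i
    M = Ehat.T @ Ehat / n   # (approximate) covariance of the inverse curve
    vals, vecs = np.linalg.eig(np.linalg.solve(Sigma + lam * np.eye(p), M))
    return vecs.real[:, np.argsort(-vals.real)[:d]]
```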
Journal:
Volume/Issue:
Pages: -
Publication year: 2006